Research

This is an overview of our research.

Current Research

SPDEs are partial differential equations with random terms arising from uncertainty in the models. They arise in many multidimensional physical problems. Examples of sources of uncertainty include the variability of soil permeability in subsurface aquifers and the heterogeneity of materials with microstructure. We work on the analysis and computation of elliptic, parabolic and hyperbolic equations with random data.
Data assimilation, or filtering, refers to the problem of combining noisy observations of a (typically physical) system with a model for that system in order to infer the state and/or parameters online as data are received. In the probabilistic context of a hidden Markov model, this leads to a recursion of Bayesian updates. The objective of the filtering problem is then to obtain the posterior distribution of the unknown as a function of the history of observations. One objective of this research is (i) to obtain better approximations of this distribution; since this is often not possible in practice, one may instead aim for an estimator (and some coarse measure of spread) which tracks the truth rather than aiming for the full distribution. Another objective is (ii) to analyze commonly used filtering algorithms from this perspective to determine their accuracy and stability properties.
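In its simplest linear-Gaussian setting, the recursion of Bayesian updates described above reduces to the scalar Kalman filter. The sketch below is purely illustrative (the model, parameter values, and function name are assumptions, not an algorithm from our work); it shows how each observation yields a predict step followed by a Bayesian update of the posterior mean and variance.

```python
# Illustrative sketch of sequential Bayesian updating: a scalar Kalman
# filter for the hidden Markov model x_k = a*x_{k-1} + N(0, q),
# y_k = x_k + N(0, r). All parameter values here are assumptions.

def kalman_filter(ys, a=0.9, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """Return posterior means m_k and variances p_k after each observation y_k."""
    m, p = m0, p0
    means, variances = [], []
    for y in ys:
        # Predict: push the current posterior through the linear dynamics.
        m_pred = a * m
        p_pred = a * a * p + q
        # Update: Bayes' rule with the Gaussian likelihood of y.
        k = p_pred / (p_pred + r)          # Kalman gain
        m = m_pred + k * (y - m_pred)
        p = (1.0 - k) * p_pred
        means.append(m)
        variances.append(p)
    return means, variances
```

Note how the variances shrink as observations accumulate: the posterior concentrates, which is the sense in which the filter "tracks the truth".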
Inspired by the recent development of pCN and other function-space MCMC samplers, and also the recent independent development of Riemann manifold methods and stochastic Newton methods, we propose a class of algorithms which combine the benefits of both, yielding various dimension-independent, likelihood-informed (DILI) sampling algorithms. These algorithms are very effective at obtaining weakly correlated samples from very high-dimensional distributions.
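A minimal sketch of the pCN proposal, shown in one dimension for readability (its dimension-independence only pays off in high dimensions, and the target used in the usage note is an illustrative assumption, not a DILI algorithm):

```python
import math, random

def pcn_sample(phi, n_steps, beta=0.2, x0=0.0, seed=0):
    """pCN MCMC for a target proportional to exp(-phi(x)) times a N(0,1) prior.

    The proposal x' = sqrt(1 - beta^2)*x + beta*xi, with xi drawn from the
    prior, preserves the Gaussian prior exactly, so the accept/reject step
    involves only the likelihood potential phi. This is the source of the
    dimension-independent behaviour of pCN-type samplers.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        xi = rng.gauss(0.0, 1.0)
        x_prop = math.sqrt(1.0 - beta * beta) * x + beta * xi
        # Accept with probability min(1, exp(phi(x) - phi(x_prop))).
        if math.log(rng.random() + 1e-300) < phi(x) - phi(x_prop):
            x = x_prop
        samples.append(x)
    return samples
```

For example, `pcn_sample(lambda x: 0.5 * (x - 1.0) ** 2, 5000)` draws from a Gaussian posterior with mean 0.5; a likelihood-informed (DILI) method would additionally adapt the proposal to the data-informed directions, which this sketch does not do.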
We are interested in developing a multiscale inference framework for pure jump processes. This allows us to gradually incorporate information and improve the estimates by using simpler models.
Fatigue tests at different speeds (cycles per minute) over a range of mean loads are performed to determine the fatigue strength of a certain kind of metallic specimen. We provide a rigorous statistical framework to support engineering decisions in model calibration, Bayesian model selection, and validation for metallic fatigue data.
We are currently developing hierarchical Bayesian techniques to infer the unknown coefficients in initial-boundary value problems (IBVPs) for linear parabolic partial differential equations.
The study of subsurface flows is an important field of research for many classical and modern applications such as oil extraction, carbon sequestration and storage, geothermal energy, groundwater remediation, etc. However, there is still high uncertainty in the definition of many important parameters and in the validity of the simplified models commonly used in these applications. We want to tackle this problem with special attention to the application of carbon storage.
Experimental design is an important topic in engineering and the sciences that pertains to model calibration, validation, and predictive modeling. We aim to systematically develop fast methodologies for experimental design, information extraction, and decision making for large-scale systems under extreme conditions. Examples include high-pressure shock tube experiments, large-scale oil reservoir management, seismic wave propagation, and impedance tomography measurements.
The ensemble Kalman filter is a popular Monte Carlo filtering algorithm. When the system can be approximated at a range of resolutions with increasing cost, ensembles at multiple resolutions can be utilized to significantly accelerate convergence via a multilevel Monte Carlo strategy.
Multi-index methods are based on sparse grid methods and utilize the extra mixed regularity between dimensions (spatial or stochastic) to reduce the work complexity of different methods. In fact, in some cases we may obtain a rate of work complexity that is independent of the number of dimensions. We have developed a Multi-Index variant of the Monte Carlo method (MIMC) and are currently developing a Multi-Index variant of the Stochastic Collocation method (MISC).
The multilevel Monte Carlo (MLMC) method uses a systematic form of variance reduction to improve the convergence rate of standard Monte Carlo simulations for stochastic differential equations.
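The variance reduction in MLMC comes from coupling coarse and fine discretizations of the same random path, so that the variance of their difference is much smaller than that of either estimator alone. The sketch below illustrates this for Euler discretizations of geometric Brownian motion; the SDE, parameters, and function names are illustrative assumptions, not our implementation.

```python
import math, random

def euler_pair(rng, T=1.0, nf=16, x0=1.0, mu=0.05, sigma=0.2):
    """One coupled sample (X_T^fine, X_T^coarse) for geometric Brownian motion
    dX = mu*X dt + sigma*X dW under Euler, with fine step T/nf and coarse step
    2T/nf, both driven by the SAME Brownian increments."""
    dt = T / nf
    xf = xc = x0
    inc_c = 0.0
    for i in range(nf):
        dw = rng.gauss(0.0, math.sqrt(dt))
        xf += mu * xf * dt + sigma * xf * dw
        inc_c += dw                       # accumulate for the coarse path
        if i % 2 == 1:                    # every two fine steps = one coarse step
            xc += mu * xc * (2 * dt) + sigma * xc * inc_c
            inc_c = 0.0
    return xf, xc

def mlmc_level_stats(n=4000, seed=1):
    """Sample variances of the fine payoff and of the coupled level difference."""
    rng = random.Random(seed)
    fines, diffs = [], []
    for _ in range(n):
        xf, xc = euler_pair(rng)
        fines.append(xf)
        diffs.append(xf - xc)
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    return var(fines), var(diffs)
```

Because the level-difference variance decays with the step size, MLMC can spend most samples on cheap coarse levels and only a few on expensive fine levels, which is the source of its improved overall complexity.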
Our main motivation is to obtain accurate and fast simulations for a class of Markovian jump processes known as stochastic reaction networks, which are used to model a wide range of phenomena.
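As a minimal illustration of simulating such a Markovian jump process, the sketch below applies Gillespie's stochastic simulation algorithm to the single decay reaction A → ∅ with propensity a(x) = c·x; the reaction, rate, and function name are illustrative assumptions.

```python
import random

def ssa_decay(x0=50, rate=1.0, t_end=5.0, seed=0):
    """Gillespie stochastic simulation of the decay reaction A -> 0.

    With x copies of A, the next reaction fires after an Exp(rate*x)
    waiting time and removes one molecule. Returns the path of (t, x)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while x > 0:
        a = rate * x                    # total propensity
        t += rng.expovariate(a)         # exponential time to next reaction
        if t > t_end:
            break
        x -= 1
        path.append((t, x))
    return path
```

Exact simulation like this becomes expensive when propensities are large, which is one motivation for the approximate and multilevel schemes we study.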
The goal of this project is twofold: (i) adapt the forward-reverse technique developed by Bayer and Schoenmakers for computing expected values of functionals of bridges obtained by observing diffusion processes at discrete times, and (ii) apply this technique in conjunction with the Expectation-Maximization algorithm to produce maximum likelihood estimators of the coefficients of the propensity functions in the context of stochastic reaction networks.
We aim to better understand earthquake distributions using Bayesian inference techniques.
We develop a stable and fast multiresolution method for radial basis function (RBF) interpolation of datasets in three dimensions.
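For readers unfamiliar with RBF interpolation, here is a toy one-dimensional version with a Gaussian kernel and a dense direct solve (illustrative assumptions throughout; our multiresolution method itself is not shown and exists precisely because this dense approach scales poorly):

```python
import math

def rbf_interpolate(xs, ys, eps=1.0):
    """Fit weights w so that s(x) = sum_j w_j * exp(-(eps*(x - xs[j]))**2)
    interpolates the data (xs, ys). Solves the dense symmetric system by
    Gaussian elimination with partial pivoting."""
    n = len(xs)
    # Augmented system [A | y], A_ij = Gaussian kernel at |x_i - x_j|.
    A = [[math.exp(-(eps * (xs[i] - xs[j])) ** 2) for j in range(n)] + [ys[i]]
         for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for k in range(col, n + 1):
                A[r][k] -= f * A[col][k]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):             # back substitution
        w[r] = (A[r][n] - sum(A[r][k] * w[k] for k in range(r + 1, n))) / A[r][r]
    return w

def rbf_eval(x, xs, w, eps=1.0):
    """Evaluate the fitted interpolant s(x)."""
    return sum(wj * math.exp(-(eps * (x - xj)) ** 2) for wj, xj in zip(w, xs))
```

The dense solve costs O(n^3) and the kernel matrix becomes ill-conditioned as points cluster; stable multiresolution formulations address both issues.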
Computational power poses heavy limitations on the achievable problem size for Kriging. In separate research lines, Kriging algorithms based on the FFT and on low-rank representations of covariance functions have been developed, both leading to drastic speedup factors. We approximate the covariance matrix in the canonical tensor format and apply convolution, exploiting the tensor properties of the FFT, inverse FFT, and Hadamard product. As a result, we reduce the total computational cost from O(n^d) to O(dn).
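The FFT speedup rests on a standard fact: a circulant matrix (a stationary covariance on a periodic grid) acts by circular convolution, which the FFT diagonalizes, turning an O(n^2) matrix-vector product into O(n log n). A minimal one-dimensional sketch (the covariance column is an illustrative assumption; a pedagogical recursive FFT stands in for a library routine):

```python
import cmath

def fft(a):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return list(a)
    even, odd = fft(a[0::2]), fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(a):
    """Inverse FFT via the conjugation trick."""
    n = len(a)
    return [x.conjugate() / n for x in fft([v.conjugate() for v in a])]

def circulant_matvec_fft(c, x):
    """y = C x, where C is the circulant matrix with first column c.

    Computed as IFFT(FFT(c) .* FFT(x)) -- the FFT diagonalizes C, and the
    elementwise (Hadamard) product replaces the dense matrix-vector product."""
    fc = fft([complex(v) for v in c])
    fx = fft([complex(v) for v in x])
    return [v.real for v in ifft([a * b for a, b in zip(fc, fx)])]
```

In d dimensions with a covariance in canonical tensor format, the same one-dimensional transforms are applied factor by factor, which is how the cost drops from O(n^d) toward O(dn) per tensor term.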